- Abstract: Accurate historical records of Earth’s surface temperatures are central to climate research and policy development. Widely used estimates based on instrumental measurements from land and sea are, however, not fully consistent at either global or regional scales. To address these challenges, we develop the Dynamically Consistent ENsemble of Temperature (DCENT), a 200-member ensemble of monthly surface temperature anomalies relative to the 1982–2014 climatology. Each DCENT member starts from 1850 and has a 5° × 5° resolution. DCENT leverages several updated or recently developed approaches to data homogenization and bias adjustment: an optimized pairwise homogenization algorithm for identifying breakpoints in land surface air temperature records, a physics-informed intercomparison method to adjust systematic offsets in sea surface temperatures recorded by ships, and a coupled energy-balance model to homogenize continental and marine records. Each approach was published individually; this paper describes the combined approach and its application in developing a gridded analysis. A notable difference of DCENT relative to existing temperature estimates is a cooler baseline for 1850–1900, which implies greater historical warming.
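As context for the anomaly convention described above (monthly values relative to a 1982–2014 climatology on a 5° × 5° grid), the following is a minimal sketch of how such anomalies are typically computed. The array shapes, synthetic data, and variable names are illustrative assumptions, not DCENT’s actual data layout or code.

```python
import numpy as np

# Illustrative gridded monthly temperatures (assumed layout):
# shape (n_months, n_lat, n_lon) for a 5-degree grid starting in Jan 1850.
n_months, n_lat, n_lon = (2024 - 1850 + 1) * 12, 36, 72
rng = np.random.default_rng(0)
temps = 15 + rng.normal(0, 1, size=(n_months, n_lat, n_lon))

years = 1850 + np.arange(n_months) // 12
months = np.arange(n_months) % 12  # 0 = January

# Climatology: mean of each calendar month over the 1982-2014 baseline.
base = (years >= 1982) & (years <= 2014)
clim = np.full((12, n_lat, n_lon), np.nan)
for m in range(12):
    clim[m] = temps[base & (months == m)].mean(axis=0)

# Anomaly = observation minus the climatology of its calendar month.
anomalies = temps - clim[months]
print(anomalies.shape)  # (n_months, 36, 72)
```

In this framing, each of the 200 DCENT members would be one such gridded anomaly array, produced under a different combination of homogenization and bias-adjustment choices.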
- Stratigraphic correlation underpins all understanding of Earth’s history, yet few geoscientists have access to, or expertise in, numerical codes that can generate reproducible, optimal (in a least-squares framework) alignments between two stratigraphic time-series data sets. Here we introduce Align, a user-friendly computer app that makes accessible a published dynamic time warping (DTW) algorithm that, in a minute or less, catalogs a library of alignments between two time-series data sets by systematically exploring assumptions about the temporal overlap and relative sedimentation rates between the two stratigraphic sections. The Align app, written in the free, open-source R programming language, utilizes a graphical user interface (e.g., drop-down menus for data upload and sliding bars for parameter exploration) such that no coding is required. In addition to generating alignment libraries, a user can employ Align to visualize, explore, and cull each alignment library according to thresholds on Pearson’s correlation coefficient and/or temporal overlap. Here we demonstrate Align with time-series records of carbonate stable carbon isotope composition, though Align can, in principle, align any two quantitative stratigraphic time-series data sets.
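Align wraps a published dynamic time warping algorithm behind an R graphical interface; the least-squares alignment idea at its core can be sketched as below. This is a generic DTW implementation written for illustration, not the algorithm or cost parameters used by Align, and the synthetic series are assumptions.

```python
import numpy as np

def dtw_align(a, b):
    """Minimal dynamic time warping: returns the least-squares alignment
    cost and the warping path between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Trace back the optimal warping path from the end of both series.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return D[n, m], path[::-1]

# Two synthetic isotope-like curves sampled at different resolutions.
x = np.sin(np.linspace(0, 6, 120))
y = np.sin(np.linspace(0, 6, 80)) + 0.05
cost, path = dtw_align(x, y)
print(round(cost, 3), len(path))
```

Align additionally sweeps assumptions about temporal overlap and relative sedimentation rates to build a library of such alignments, which this sketch does not attempt.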
- Abstract: A major uncertainty in reconstructing historical sea surface temperature (SST) before the 1990s involves correcting for systematic offsets associated with bucket and engine-room intake temperature measurements. A recent study used a linear scaling of coastal station-based air temperatures (SATs) to infer nearby SSTs, but the physics of the coupling between SATs and SSTs generally gives rise to more complex regional air–sea temperature differences. In this study, an energy-balance model (EBM) of air–sea thermal coupling is adapted for predicting near-coast SSTs from coastal SATs. The model is shown to be more skillful than linear-scaling approaches through cross-validation analyses using instrumental records after the 1960s and CMIP6 simulations between 1880 and 2020. Improved skill primarily comes from capturing features reflecting air–sea heat fluxes that dominate temperature variability at high latitudes, including damping high-frequency wintertime SAT variability and reproducing the phase lag between SSTs and SATs. Inferred near-coast SSTs allow for intercalibrating coastal SAT and SST measurements at a variety of spatial scales. The 1900–40 mean offset between the latest SST estimates available from the Met Office (HadSST4) and SAT-inferred SSTs ranges between −1.6°C (95% confidence interval: [−1.7°C, −1.4°C]) and 1.2°C ([0.8°C, 1.6°C]) across 10° × 10° grids. When further averaged along the global coastline, HadSST4 is significantly colder than SAT-inferred SSTs, by 0.20°C ([0.07°C, 0.35°C]), over 1900–40. These results indicate that historical SATs and SSTs involve substantial inconsistencies at both regional and global scales. Major outstanding questions involve the distribution of errors between our intercalibration model and instrumental records of SAT and SST, as well as the degree to which coastal intercalibrations are informative of global trends. Significance Statement: To evaluate the consistency of instrumental surface temperature estimates before the 1990s, we develop a coupled energy-balance model to intercalibrate measurements of sea surface temperature (SST) and station-based air temperature (SAT) near global coasts. Our model captures geographically varying physical regimes of air–sea coupling and outperforms existing methods in inferring regional SSTs from SAT measurements. When applied to historical temperature records, the model indicates significant discrepancies between inferred and observed SSTs at both global and regional scales before the 1960s. Our findings suggest remaining data issues in historical temperature archives and opportunities for further improvements.
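The qualitative behaviour described above, damping of high-frequency SAT variability and a phase lag in the SST response, can be illustrated with a first-order relaxation of SST toward the air temperature. This is a deliberately simplified stand-in for intuition only, not the coupled EBM developed in the study; the relaxation timescale and synthetic forcing are assumed values.

```python
import numpy as np

def relax_sst(sat, tau_months=3.0, dt=1.0):
    """Illustrative mixed-layer response: SST relaxes toward SAT with
    timescale tau, which damps fast variability and introduces a lag."""
    sst = np.empty_like(sat)
    sst[0] = sat[0]
    for t in range(1, len(sat)):
        sst[t] = sst[t - 1] + dt / tau_months * (sat[t - 1] - sst[t - 1])
    return sst

# Synthetic monthly SAT: seasonal cycle plus high-frequency weather noise.
months = np.arange(240)
sat = 10 * np.sin(2 * np.pi * months / 12) + np.random.default_rng(1).normal(0, 2, 240)
sst = relax_sst(sat)

# The seasonal cycle survives but is reduced in amplitude and lagged,
# while month-to-month noise is strongly damped.
print(np.std(np.diff(sat)).round(2), np.std(np.diff(sst)).round(2))
```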
- Abstract: Land surface air temperatures (LSAT) inferred from weather station data differ among major research groups. The estimate by NOAA’s monthly Global Historical Climatology Network (GHCNm) averages 0.02°C cooler between 1880 and 1940 than Berkeley Earth’s and 0.14°C cooler than the Climatic Research Unit’s estimate. Such systematic offsets can arise from differences in how poorly documented changes in measurement characteristics are detected and adjusted. Building upon an existing pairwise homogenization algorithm used in generating the fourth version of NOAA’s GHCNm (v4), hereafter PHA0, we propose two revisions to account for autocorrelation in climate variables. One version, PHA1, makes minimal modification to PHA0 by extending the threshold used in breakpoint detection to be a function of LSAT autocorrelation. The other version, PHA2, uses penalized likelihood to detect breakpoints by optimizing a model-selection problem globally. To facilitate efficient optimization for series with more than 1000 time steps, a multiparent genetic algorithm is proposed for PHA2. Tests on synthetic data generated by adding breakpoints to CMIP6 simulations and realizations from a Gaussian process indicate that PHA1 and PHA2 both similarly outperform PHA0 in recovering accurate climatic trends. Applied to unhomogenized GHCNm v4, both revised algorithms detect breakpoints that correspond with available station metadata. Uncertainties are estimated by perturbing algorithmic parameters, and an ensemble is constructed by pooling 50 PHA1-based and 50 PHA2-based members. The continental-mean warming in this new ensemble is consistent with that of Berkeley Earth, despite using different homogenization approaches. Relative to unhomogenized data, our homogenization increases the 1880–2022 trend by 0.16 [0.12, 0.19] °C century⁻¹ (95% confidence interval), leading to continental-mean warming of 1.65 [1.62, 1.69] °C over 2010–22 relative to 1880–1900. Significance Statement: Accurately correcting for systematic errors in observational records of land surface air temperature (LSAT) is critical for quantifying historical warming. Existing LSAT estimates are subject to systematic offsets associated with processes including changes in instrumentation and station movement. This study improves a pairwise homogenization algorithm by accounting for the fact that climate signals are correlated over time. The revised algorithms outperform the original in identifying discontinuities and recovering accurate warming trends. Applied to monthly station temperatures, the revised algorithms adjust trends in continental-mean LSAT since the 1880s to be 0.16 °C century⁻¹ greater relative to raw data. Our estimate is most consistent with that from Berkeley Earth and indicates lesser and greater warming than estimates from NOAA and the Met Office, respectively.
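To illustrate the penalized-likelihood idea behind the PHA2 variant, the sketch below scores a single mean-shift breakpoint in a pairwise difference series against a BIC-style penalty. It is a toy version under assumed noise parameters: the actual algorithm handles multiple breakpoints, accounts for autocorrelation, and searches the model space with a multiparent genetic algorithm, none of which are reproduced here.

```python
import numpy as np

def best_single_breakpoint(x, min_seg=10):
    """Score a single mean-shift breakpoint with a BIC-like penalty and
    compare it against the no-break model. Returns (index or None, score gain)."""
    n = len(x)
    sse0 = np.sum((x - x.mean()) ** 2)
    bic0 = n * np.log(sse0 / n) + np.log(n)          # one mean parameter
    best = (None, 0.0)
    for k in range(min_seg, n - min_seg):
        sse = np.sum((x[:k] - x[:k].mean()) ** 2) + np.sum((x[k:] - x[k:].mean()) ** 2)
        bic = n * np.log(sse / n) + 3 * np.log(n)    # two means + break location
        gain = bic0 - bic
        if gain > best[1]:
            best = (k, gain)
    return best

# Synthetic pairwise difference series with a 0.5-unit shift at month 120.
rng = np.random.default_rng(2)
diff = rng.normal(0, 0.3, 240)
diff[120:] += 0.5
print(best_single_breakpoint(diff))  # should recover a breakpoint near index 120
```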
- Abstract: Neural networks were previously applied to reconstruct climate indices from tree rings but showed mixed results in skill relative to more standard linear methods. A two‐layer neural network is explored for purposes of reconstructing summertime self‐calibrated Palmer Drought Severity Index (scPDSI) across the contiguous United States. Reconstructions using neural networks are more skillful than a linear approach at 75% of the gridboxes if evaluated by the coefficient of efficiency and at 54% when using the Pearson correlation coefficient. The increased reconstruction skill is related to the network capturing nonlinear growth‐climate relationships. In the Southwest, in particular, a nonlinear response function captures a diminishing sensitivity of growth to moisture under wetter conditions, consistent with alleviation of moisture stress. These results indicate somewhat less‐severe and more‐stable incidences of drought over the past two centuries in the U.S. Southwest.
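For readers unfamiliar with the skill metric or the model class, the sketch below implements the coefficient of efficiency and a small two-layer (single hidden layer) network on synthetic data with a saturating growth-moisture response. The layer width, learning rate, and data are illustrative assumptions and do not reflect the network configuration or tree-ring predictors used in the study.

```python
import numpy as np

def coefficient_of_efficiency(obs, pred):
    """Nash-Sutcliffe-style coefficient of efficiency: 1 minus the ratio of
    squared prediction error to the variance of observations about their mean."""
    return 1.0 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)

class TwoLayerNet:
    """Tiny two-layer (one hidden layer) network trained by full-batch
    gradient descent to map predictors to a drought-index-like target."""
    def __init__(self, n_in, n_hidden=8, lr=0.1, seed=3):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, n_hidden)
        self.b2 = 0.0
        self.lr = lr

    def predict(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)   # hidden layer (tanh)
        return self.h @ self.W2 + self.b2         # linear output layer

    def fit(self, X, y, epochs=2000):
        for _ in range(epochs):
            err = self.predict(X) - y
            grad_out = err / len(y)                          # d(loss)/d(output)
            grad_h = np.outer(grad_out, self.W2) * (1 - self.h ** 2)
            self.W2 -= self.lr * (self.h.T @ grad_out)
            self.b2 -= self.lr * grad_out.sum()
            self.W1 -= self.lr * (X.T @ grad_h)
            self.b1 -= self.lr * grad_h.sum(axis=0)

# Synthetic example: predictors with a saturating (nonlinear) moisture response.
rng = np.random.default_rng(4)
X = rng.normal(0, 1, (200, 5))
y = np.tanh(X[:, 0]) + 0.1 * rng.normal(size=200)
net = TwoLayerNet(n_in=5)
net.fit(X, y)
print(round(coefficient_of_efficiency(y, net.predict(X)), 2))
```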
- Abstract: How summertime temperature variability will change with warming has important implications for climate adaptation and mitigation. CMIP5 simulations indicate a compound risk of extreme hot temperatures in western Europe from both warming and increasing temperature variance. CMIP6 simulations, however, indicate only a moderate increase in temperature variance that does not covary with warming. To explore this intergenerational discrepancy in CMIP results, we decompose changes in monthly temperature variance into those arising from changes in sensitivity to forcing and changes in forcing variance. Across models, sensitivity increases with local warming in both CMIP5 and CMIP6 at an average rate of 5.7 ([3.7, 7.9]; 95% c.i.) × 10⁻³ °C per W m⁻² per °C of warming. We use a simple model of moist surface energetics to explain increased sensitivity as a consequence of greater atmospheric demand (∼70%) and drier soil (∼40%) that is partially offset by the Planck feedback (∼−10%). Conversely, forcing variance is stable in CMIP5 but decreases with warming in CMIP6 at an average rate of −21 ([−28, −15]; 95% c.i.) W² m⁻⁴ per °C of warming. We examine scaling relationships with mean cloud fraction and find that mean forcing variance decreases with decreasing cloud fraction at twice the rate in CMIP6 as in CMIP5. The stability of CMIP6 temperature variance is thus a consequence of offsetting changes in sensitivity and forcing variance. Further work to determine which models and generations of CMIP simulations better represent changes in cloud radiative forcing is important for assessing risks associated with increased temperature variance.
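The decomposition referred to above can be written compactly in a simplified linear-response form. The notation below is an illustrative reconstruction consistent with the units quoted in the abstract (°C per W m⁻² for sensitivity, W² m⁻⁴ for forcing variance), not necessarily the exact formulation used in the paper.

```latex
% T' : monthly temperature anomaly, F' : radiative forcing anomaly,
% \lambda : sensitivity of temperature to forcing (assumed notation).
\begin{align}
  T' &\approx \lambda\, F', \\
  \operatorname{Var}(T') &\approx \lambda^{2}\, \operatorname{Var}(F'), \\
  \Delta \operatorname{Var}(T') &\approx
      \underbrace{2\lambda\, \operatorname{Var}(F')\, \Delta\lambda}_{\text{change in sensitivity}}
    \; + \;
      \underbrace{\lambda^{2}\, \Delta\operatorname{Var}(F')}_{\text{change in forcing variance}}.
\end{align}
```

In this framing, CMIP5 and CMIP6 agree on the first term (sensitivity increases with warming) but differ on the second (forcing variance is stable in CMIP5 and decreases in CMIP6), which is the offsetting behaviour the abstract describes.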
- Abstract: Tree-ring chronologies underpin the majority of annually-resolved reconstructions of Common Era climate. However, they are derived using different datasets and techniques, the ramifications of which have hitherto been little explored. Here, we report the results of a double-blind experiment that yielded 15 Northern Hemisphere summer temperature reconstructions from a common network of regional tree-ring width datasets. Taken together as an ensemble, the Common Era reconstruction mean correlates with instrumental temperatures from 1794–2016 CE at 0.79 (p < 0.001), reveals summer cooling in the years following large volcanic eruptions, and exhibits strong warming since the 1980s. Differing in their mean, variance, amplitude, sensitivity, and persistence, the ensemble members demonstrate the influence of subjectivity in the reconstruction process. We therefore recommend the routine use of ensemble reconstruction approaches to provide a more consensual picture of past climate variability.